A Guide for Sparse PCA: Model Comparison and Applications
Authors
Abstract
Similar Resources
Deflation Methods for Sparse PCA
In analogy to the PCA setting, the sparse PCA problem is often solved by iteratively alternating between two subtasks: cardinality-constrained rank-one variance maximization and matrix deflation. While the former has received a great deal of attention in the literature, the latter is seldom analyzed and is typically borrowed without justification from the PCA context. In this work, we demonstra...
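For intuition, here is a minimal NumPy sketch of the classical (Hotelling's) deflation step that, as the abstract notes, is typically borrowed from the PCA context; the function name and toy data are illustrative, not from the paper.

```python
import numpy as np

def hotelling_deflation(cov, v):
    """Classical (Hotelling's) deflation: remove the variance captured by
    direction v from the covariance matrix cov. With a sparse, non-eigenvector
    v this update can misbehave (e.g. break positive semidefiniteness), which
    is the kind of issue the paper above analyzes."""
    v = v / np.linalg.norm(v)
    return cov - (v @ cov @ v) * np.outer(v, v)

# Toy usage: deflate the leading eigenvector of a random covariance matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
cov = A @ A.T
w, V = np.linalg.eigh(cov)              # ascending eigenvalues
v1 = V[:, -1]                           # leading eigenvector
cov_def = hotelling_deflation(cov, v1)
print(np.allclose(v1 @ cov_def @ v1, 0.0))  # True: variance along v1 removed
```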
Minimax Rates for Sparse PCA
We state below two results that we use frequently in our proofs. The first is a well-known consequence of the CS decomposition. It relates the canonical angles between subspaces to the singular values of products and differences of their corresponding projection matrices.
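To illustrate the relation the authors use: for orthonormal bases U and V, the cosines of the canonical angles between the column spaces are the singular values of U^T V, and the sines appear among the singular values of the difference of the projection matrices. A small NumPy check on a toy one-dimensional example (not from the paper):

```python
import numpy as np

def canonical_angles(U, V):
    """Canonical (principal) angles between the column spaces of the
    orthonormal bases U and V: the cosines are the singular values of U^T V."""
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Two lines in R^2 separated by an angle t = pi/6.
t = np.pi / 6
U = np.array([[1.0], [0.0]])
V = np.array([[np.cos(t)], [np.sin(t)]])
print(np.allclose(canonical_angles(U, V), [t]))   # True

# The singular values of the difference of projection matrices are sin(t)
# (here with multiplicity two), matching the stated relation.
Pu, Pv = U @ U.T, V @ V.T
sv = np.linalg.svd(Pu - Pv, compute_uv=False)
print(np.allclose(sv, np.sin(t)))                 # True
```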
Optimal Sparse Linear Encoders and Sparse PCA
Principal components analysis (PCA) is the optimal linear encoder of data. Sparse linear encoders (e.g., sparse PCA) produce more interpretable features that can promote better generalization. (i) Given a level of sparsity, what is the best approximation to PCA? (ii) Are there efficient algorithms which can achieve this optimal combinatorial tradeoff? We answer both questions by providing the f...
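As a point of comparison for question (i), a naive sparse encoder simply truncates the leading principal component to its largest entries. This baseline is NOT the paper's optimal algorithm; the function name is illustrative.

```python
import numpy as np

def truncated_pc1(X, k):
    """Naive sparse-encoder baseline: compute the leading principal
    component of X and keep only its k largest-magnitude entries,
    renormalizing the result to unit length."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0].copy()
    idx = np.argsort(np.abs(v))[:-k]   # indices of all but the k largest
    v[idx] = 0.0
    return v / np.linalg.norm(v)

# Toy usage: a 2-sparse approximation of the first principal component.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
v = truncated_pc1(X, 2)
print(np.count_nonzero(v) <= 2, np.isclose(np.linalg.norm(v), 1.0))
```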
A Hitchhiker’s Guide to PCA and CCA
2 Principal Component Analysis (PCA) Consider a random variable X ∈ R^n, an n-dimensional vector. We want to derive a lower-dimensional variable X̃ ∈ R^m (m ≤ n) to represent X. Intuitively, X̃ should capture as much information about X as possible in these fewer dimensions. PCA finds X̃ = (X̃_1, …, X̃_m) such that each component X̃_i ∈ R is a one-dimensional projection of X with maximum variance ...
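The construction described above can be sketched via eigendecomposition of the sample covariance; this is a standard minimal implementation, not code from the guide.

```python
import numpy as np

def pca(X, m):
    """PCA via eigendecomposition of the sample covariance: returns the m
    orthonormal directions of maximum variance (columns of W) and the
    projected lower-dimensional data."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    w, V = np.linalg.eigh(cov)       # eigenvalues in ascending order
    W = V[:, ::-1][:, :m]            # top-m eigenvectors, descending variance
    return W, Xc @ W

# Toy usage: reduce 5-dimensional correlated data to 2 dimensions.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 5))
W, Z = pca(X, 2)
print(W.shape, Z.shape)              # (5, 2) (100, 2)
```

By construction the first projected coordinate carries at least as much variance as the second, matching the maximum-variance characterization in the text.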
Sparse Phase Retrieval via Sparse PCA Despite Model Misspecification: A Simplified and Extended Analysis
We consider the problem of high-dimensional misspecified phase retrieval. This is where we have an s-sparse signal vector x∗ ∈ R^n, which we wish to recover using sampling vectors a_1, …, a_m, and measurements y_1, …, y_m, which are related by the equation f(〈a_i, x∗〉) = y_i. Here, f is an unknown link function satisfying a positive correlation with the quadratic function. This problem was r...
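A minimal sketch of this measurement model and a standard spectral estimator, assuming f = |·| as one possible link positively correlated with the quadratic; the sparse-PCA refinement the abstract alludes to is omitted, and all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 50, 2000, 3

# s-sparse ground-truth signal (support and values are illustrative).
x_star = np.zeros(n)
x_star[:s] = rng.standard_normal(s)
x_star /= np.linalg.norm(x_star)

A = rng.standard_normal((m, n))   # Gaussian sampling vectors a_i (rows)
f = np.abs                        # one link positively correlated with z -> z^2
y = f(A @ x_star)                 # measurements y_i = f(<a_i, x_star>)

# Standard spectral surrogate: the top eigenvector of (1/m) sum_i y_i a_i a_i^T
# aligns (up to sign) with x_star when the link correlates with the quadratic.
M = (A.T * y) @ A / m
w, V = np.linalg.eigh(M)
x_hat = V[:, -1]
print(abs(x_hat @ x_star))        # typically close to 1 for large m
```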
Journal
Journal title: Psychometrika
Year: 2021
ISSN: 0033-3123, 1860-0980
DOI: 10.1007/s11336-021-09773-2